Project #03 Multimodal Guitar: Performance Toolbox and Study Workbench

Authors

  • Christian Frisson
  • Wen-Yang Chu
  • Otso Lähdeoja
  • Cécile Picard
  • Ao Shen
  • Todor Todoroff
Abstract

This project studies how recent interactive technologies can help extend the way we play the guitar, thereby defining the “multimodal guitar”. We investigate two axes: 1) a gestural/polyphonic sensing and processing toolbox to augment guitar performances, and 2) an interactive guitar score-following environment for adaptive learning. These approaches share similar technological challenges (sensing, analysis, processing, synthesis and interaction methods) and dissemination intentions (community-based, low-cost, and open-source whenever possible), while leading to different applications (artistic and educational, respectively), targeted at both experienced players and beginners. We designed and developed a toolbox for multimodal guitar performances containing the following tools: Polyphonic Pitch Estimation (see section III-A1), Fretboard Grouping (see section III-A2), Rear-mounted Pressure Sensors (see section III-B), Infinite Sustain (see section III-C2), Rearranging Looper (see section III-C3), and Smart Harmonizer (see section III-C4). The Modal Synthesis tool (see section III-C1) needs to be refined before being released. We also designed a low-cost offline system for guitar score following (see section IV-A). An audio modality, polyphonic pitch estimation from a monophonic audio signal, is the main source of information (see section IV-B), while a visual input modality, finger and headstock tracking using computer vision techniques on two webcams, provides complementary information (see section IV-C). We built a stable data acquisition approach aimed at low information loss (see section IV-E). We built a probability-based fusion scheme that handles missing data, as well as unexpected or misinterpreted results from individual modalities, in order to obtain better multi-pitch transcription results (see section IV-D). We designed a visual output modality that simultaneously visualizes the guitar score and feedback from the score-following evaluation (see section IV-F). The audio modality and parts of the visual input modality are already designed to run in real time; the multimodal fusion and visualization still need to be improved before the whole system can run in real time.
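
To make the fusion step concrete, the following is a minimal Python sketch of a probability-based combination of pitch candidates from the audio and visual modalities. It is an illustration under stated assumptions, not the project's implementation described in section IV-D: the function name, the linear per-pitch weighting, the thresholding, and the dictionary representation of each modality's output are all hypothetical.

    # Minimal sketch (assumption, not the project's code): each modality reports,
    # per candidate MIDI pitch, a probability that the note is currently sounding.
    # A missing modality contributes nothing, so the fusion degrades gracefully.

    def fuse_pitches(audio_post, vision_post, w_audio=0.7, w_vision=0.3, threshold=0.5):
        """Weighted per-pitch fusion of audio and vision estimates.

        audio_post / vision_post: dict {midi_pitch: probability}, or None if the
        modality produced no usable data for this frame.
        Returns the set of pitches whose fused probability exceeds the threshold.
        """
        fused = {}
        weight_sum = 0.0
        for post, weight in ((audio_post, w_audio), (vision_post, w_vision)):
            if not post:
                continue  # missing or rejected modality for this frame
            weight_sum += weight
            for pitch, prob in post.items():
                fused[pitch] = fused.get(pitch, 0.0) + weight * prob
        if weight_sum == 0.0:
            return set()
        # Renormalize by the weights that actually contributed, then threshold.
        return {p for p, v in fused.items() if v / weight_sum >= threshold}

    # Example frame: audio strongly suggests E4 (MIDI 64) and is unsure about B3 (59);
    # the fretboard view supports both candidates, so both notes pass the threshold.
    print(fuse_pitches({64: 0.9, 59: 0.5}, {64: 0.8, 59: 0.7}))  # {64, 59}, set order may vary

Renormalizing by the weights that actually contributed lets a single modality carry a frame on its own when the other drops out, which mirrors the goal of tolerating missing or misinterpreted per-modality results stated above.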

Similar resources

Multimodal Guitar: A Toolbox For Augmented Guitar Performances

This project aims at studying how recent interactive and interaction technologies would help extend how we play the guitar, thus defining the “multimodal guitar”. Our contributions target three main axes: audio analysis, gestural control and audio synthesis. For this purpose, we designed and developed a freely available toolbox for augmented guitar performances, compliant with the PureData and...

A multimodal tempo and beat-tracking system based on audiovisual information from live guitar performances

The aim of this paper is to improve beat-tracking for live guitar performances. Beat-tracking estimates musical measurements such as tempo and phase, and is critical for achieving synchronized ensemble performance, for example musical robot accompaniment. Beat-tracking of a live guitar performance has to deal with three challenges: tempo fluctuation, beat pattern ...

Estimation of Guitar Fingering and Plucking Controls based on Multimodal Analysis of Motion, Audio and Musical Score

This work presents a method for the extraction of instrumental controls during guitar performances. The method is based on the analysis of multimodal data consisting of a combination of motion capture, audio analysis and musical score. High-speed video cameras based on marker identification are used to track the positions of finger bones and articulations, and audio is recorded with a transducer ...

A Mobile Wireless Augmented Guitar

We present the design of a mobile augmented guitar based on traditional playing, combined with gesture-based continuous control of audio processing. Remote sound processing is enabled through our dynamically reconfigurable low-latency, high-fidelity audio streaming protocol, included in a mobile wearable wireless platform. Initial results show the suitability of audio and sensor data forwarding ove...

Journal title:

Volume:   Issue:

Pages:   -

Publication date: 2015